Bregman divergence as relative operator entropy
Abstract
Let C be a convex set in a Banach space and let H be a Hilbert space. For an operator-valued smooth functional Φ : C → B(H), we define the Bregman divergence of x, y ∈ C as

D_Φ(x, y) := Φ(x) − Φ(y) − lim_{t→+0} (1/t) [Φ(y + t(x − y)) − Φ(y)].

Recall that originally Lev Bregman introduced this concept for real-valued convex functions Φ [2]. On the self-adjoint operators there is a standard partial ordering: A ≤ B if ⟨v, Av⟩ ≤ ⟨v, Bv⟩ for every vector v in the Hilbert space. Hence the convexity of Φ : C → B(H) makes sense, and D_Φ(x, y) ≥ 0 remains true. We are basically interested in the case when C is the set of positive semidefinite matrices of trace 1 and Φ(x) = η(x), where η(t) = t log t is a common function defined on R+ and η(x) is defined by the functional calculus. However, we also consider Φ(x) = f(x) for a general smooth function f. Remember that positive semidefinite matrices of trace 1 are known as density matrices and play a central role in quantum information theory [6, 7]. Concerning Umegaki's quantum relative entropy, Chap. 1 of [7] is our main reference.
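The definition above can be illustrated numerically. The sketch below (not from the paper; the helper names and the finite-difference step are my own choices) approximates the directional-derivative limit with a small t for Φ(x) = x log x, and checks that the trace of the resulting operator-valued Bregman divergence agrees with Umegaki's relative entropy Tr x(log x − log y) on two trace-1 density matrices:

```python
import numpy as np
from scipy.linalg import logm

def phi(x):
    # Φ(x) = x log x, defined via the functional calculus (operator-valued)
    return x @ logm(x)

def bregman_divergence(x, y, t=1e-6):
    # D_Φ(x, y) ≈ Φ(x) − Φ(y) − (1/t)[Φ(y + t(x − y)) − Φ(y)]
    # (finite-difference stand-in for the one-sided limit t → +0)
    return phi(x) - phi(y) - (phi(y + t * (x - y)) - phi(y)) / t

# Two positive definite matrices of trace 1 (density matrices)
x = np.array([[0.7, 0.2], [0.2, 0.3]])
y = np.array([[0.5, 0.0], [0.0, 0.5]])

D = bregman_divergence(x, y)

# Umegaki's quantum relative entropy S(x||y) = Tr x (log x − log y);
# for Φ(x) = x log x and Tr x = Tr y = 1, it equals Tr D_Φ(x, y).
umegaki = np.trace(x @ (logm(x) - logm(y))).real
print(np.trace(D).real, umegaki)
```

The equality Tr D_Φ(x, y) = S(x||y) uses that the trace of the directional derivative of x ↦ x log x at y in direction h is Tr h(log y + I), and the I-term vanishes since Tr(x − y) = 0 for trace-1 arguments.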
Similar resources
Parametric Bayesian Estimation of Differential Entropy and Relative Entropy
Given iid samples drawn from a distribution with known parametric form, we propose the minimization of expected Bregman divergence to form Bayesian estimates of differential entropy and relative entropy, and derive such estimators for the uniform, Gaussian, Wishart, and inverse Wishart distributions. Additionally, formulas are given for a log gamma Bregman divergence and the differential entrop...
Re-examination of Bregman functions and new properties of their divergences
The Bregman divergence (Bregman distance, Bregman measure of distance) is a certain useful substitute for a distance, obtained from a well-chosen function (the “Bregman function”). Bregman functions and divergences have been extensively investigated during the last decades and have found applications in optimization, operations research, information theory, nonlinear analysis, machine learning ...
Some properties of the parametric relative operator entropy
The notion of entropy was introduced by Clausius in 1850, and some of the main steps towards the consolidation of the concept were taken by Boltzmann and Gibbs. Since then several extensions and reformulations have been developed in various disciplines with motivations and applications in different subjects, such as statistical mechanics, information theory, and dynamical systems. Fujii and Kam...
Robustness of the second law of thermodynamics under generalizations of the maximum entropy method
It is shown that the laws of thermodynamics are extremely robust under generalizations of the form of entropy. Using the Bregman-type relative entropy, the Clausius inequality is proved to be always valid. This implies that thermodynamics is highly universal and does not rule out consistent generalization of the maximum entropy method.
Convex Foundations for Generalized MaxEnt Models
We present an approach to maximum entropy models that highlights the convex geometry and duality of GEFs and their connection to Bregman divergences. Using our framework, we are able to resolve a puzzling aspect of the bijection of [1] between classical exponential families and what they call regular Bregman divergences. Their regularity condition rules out all but Bregman divergences generated...
Worst-Case and Smoothed Analysis of k-Means Clustering with Bregman Divergences
The k-means algorithm is the method of choice for clustering large-scale data sets and it performs exceedingly well in practice. Most of the theoretical work is restricted to the case that squared Euclidean distances are used as similarity measure. In many applications, however, data is to be clustered with respect to other measures like, e.g., relative entropy, which is commonly used to cluste...
Journal title:
Volume Issue
Pages -
Publication year: 2006